Generalization of Jeffreys' Divergence-Based Priors for Bayesian Hypothesis Testing
Author
Abstract
In this paper we introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties, such as information (finite-sample) consistency, and they are often similar to other existing proposals, like the intrinsic priors; moreover, in normal linear model scenarios they exactly reproduce the Jeffreys-Zellner-Siow priors. Most importantly, in challenging scenarios such as irregular models and mixture models, the DB priors are well defined and very reasonable, while alternative proposals are not. We derive approximations to the DB priors as well as MCMC and asymptotic expressions for the associated Bayes factors.
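To make the construction concrete (the abstract above does not display formulas, so the standardization and the exponent q below are illustrative assumptions rather than the paper's exact definitions): Jeffreys' divergence between a null model f_0 and an alternative f_1(· | θ) is the symmetrized Kullback–Leibler divergence, a DB prior is obtained by turning a sample-size-standardized version of it into a proper density, and model comparison then proceeds through the Bayes factor.
\[
J(\theta) = \mathrm{KL}\!\left(f_1(\cdot \mid \theta)\,\|\,f_0\right) + \mathrm{KL}\!\left(f_0\,\|\,f_1(\cdot \mid \theta)\right),
\qquad
\pi^{DB}(\theta) \propto \left(1 + \bar{J}(\theta)\right)^{-q},
\]
\[
B_{10}(x) = \frac{\int f_1(x \mid \theta)\,\pi^{DB}(\theta)\,d\theta}{f_0(x)},
\]
where \(\bar{J}\) denotes a per-observation (unitary) version of \(J\) and \(q > 0\) is an illustrative tuning constant chosen so that \(\pi^{DB}\) is proper.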
Similar articles
Shrinkage Priors for Bayesian Prediction
We investigate shrinkage priors for constructing Bayesian predictive distributions. It is shown that there exist shrinkage predictive distributions asymptotically dominating Bayesian predictive distributions based on the Jeffreys prior or other vague priors if the model manifold satisfies some differential geometric conditions. Kullback–Leibler divergence from the true distribution to a predic...
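As a sketch of the standard objects this blurb refers to (general definitions, not taken from the truncated text): the Bayesian predictive density under a prior \(\pi\) and its Kullback–Leibler risk are
\[
p_\pi(y \mid x) = \int p(y \mid \theta)\,\pi(\theta \mid x)\,d\theta,
\qquad
R(\theta, p_\pi) = \mathrm{E}_{x \mid \theta}\!\left[\mathrm{KL}\!\left(p(\cdot \mid \theta)\,\|\,p_\pi(\cdot \mid x)\right)\right];
\]
one predictive distribution (asymptotically) dominates another when its risk is never larger and is strictly smaller for some \(\theta\).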
Location Reparameterization and Default Priors for Statistical Analysis
This paper develops default priors for Bayesian analysis that reproduce familiar frequentist and Bayesian analyses for models that are exponential or location. For the vector parameter case there is an information adjustment that avoids the Bayesian marginalization paradoxes and properly targets the prior on the parameter of interest, thus adjusting for any complicating nonlinearity; the details ...
Bayesian Sample Size Determination for Longitudinal Studies with Continuous Response Using Marginal Models
Introduction: Longitudinal study designs are common in many areas of scientific research, especially in the medical, social, and economic sciences. The reason is that longitudinal studies allow researchers to measure changes in each individual over time and often have higher statistical power than cross-sectional studies. Choosing an appropriate sample size is a crucial step in a successful study. A st...
Simultaneous Prediction of Independent Poisson Observables
Simultaneous predictive distributions for independent Poisson observables are investigated. A class of improper prior distributions for Poisson means is introduced. The Bayesian predictive distributions based on priors from the introduced class are shown to be admissible under the Kullback–Leibler loss. A Bayesian predictive distribution based on a prior in this class dominates the Bayesian pre...
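For orientation only (a generic sketch, not the specific prior class introduced in that paper): with independent Poisson observables \(x_i\) having means \(\lambda_i\), a Bayesian predictive distribution and the Kullback–Leibler loss used to judge it take the form
\[
p_\pi(y \mid x) = \int \prod_i \frac{\lambda_i^{y_i} e^{-\lambda_i}}{y_i!}\;\pi(\lambda \mid x)\,d\lambda,
\qquad
L(\lambda, \hat{p}) = \mathrm{KL}\!\left(p(\cdot \mid \lambda)\,\|\,\hat{p}(\cdot)\right).
\]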
Application of Kähler manifold to signal processing and Bayesian inference
We review the information geometry of linear systems and its application to Bayesian inference, and the simplification available in the Kähler manifold case. We find conditions for the information geometry of linear systems to be Kähler, and the relation of the Kähler potential to information geometric quantities such as α-divergence, information distance and the dual α-connection structure. The...
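For reference (a standard definition, not a result of the review above), the α-divergence family mentioned here can be written, for \(\alpha \neq \pm 1\), as
\[
D^{(\alpha)}(p\,\|\,q) = \frac{4}{1-\alpha^2}\left(1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\,dx\right),
\]
with the two directed Kullback–Leibler divergences recovered in the limits \(\alpha \to \pm 1\).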
Journal:
Volume / Issue:
Pages: -
Publication date: 2008